Paper/DL/CNN: 2019 "A Survey of the Recent Architectures of Deep Convolutional Neural Networks", Translation and Interpretation (Parts 1~3)


        In 2016, the width of the network was also explored in connection with depth to improve feature learning [34], [35]. Apart from this, no new architectural modification became prominent; instead, different researchers used hybrids of the already proposed architectures to improve deep CNN performance [33], [104]–[106], [109], [110]. This suggested that factors other than the appropriate assembly of network units might be more important in effectively regulating CNN performance. In this regard, Hu et al. (2017) identified that the network representation plays a role in the learning of deep CNNs [111]. They introduced the idea of feature-map exploitation and pinpointed that less informative and domain-extraneous features may affect the performance of the network to a large extent. Exploiting this idea, they proposed a new architecture named the Squeeze and Excitation Network (SE-Network) [111]. It exploits feature-map (commonly known as channel in the literature) information through a specialized SE-block, which assigns a weight to each feature map depending upon its contribution to class discrimination. This idea was further investigated by different researchers, who assigned attention to important regions by exploiting both spatial and feature-map (channel) information [37], [38], [112]. In 2018, a new idea of channel boosting was introduced by Khan et al. [36]. The motivation behind training a network with a boosted channel representation was to use an enriched representation. This idea effectively boosts the performance of a CNN by learning diverse features as well as exploiting the already learnt features through the concept of transfer learning (TL).
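
The SE-block mechanism described above can be illustrated with a short sketch. The following is a minimal PyTorch-style example of the squeeze-and-excitation idea, not the authors' reference implementation: each feature map is globally average-pooled (squeeze), passed through a small bottleneck network, and then rescaled by the resulting per-channel weight (excitation). The class name `SEBlock` and the reduction ratio of 16 are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Minimal squeeze-and-excitation block (illustrative sketch):
    squeeze each feature map to a single descriptor, learn a per-channel
    weight with a small bottleneck MLP, and rescale the feature maps."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool2d(1)            # N x C x H x W -> N x C x 1 x 1
        self.excite = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),                                  # per-channel weight in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, _, _ = x.shape
        w = self.squeeze(x).view(n, c)                     # channel descriptors
        w = self.excite(w).view(n, c, 1, 1)                # channel weights
        return x * w                                       # reweight each feature map

# Usage example: reweight a batch of 64-channel feature maps
if __name__ == "__main__":
    feats = torch.randn(8, 64, 32, 32)
    out = SEBlock(64)(feats)
    print(out.shape)                                       # torch.Size([8, 64, 32, 32])
```

The key design point is that the block adds only a lightweight bottleneck per stage, so channel-wise reweighting can be dropped into existing architectures (e.g. after a residual block) with little computational overhead.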


